Model Training Report

Generated for model: models/resnet50_prune20pct_best_model.pth

Model Architecture

Text Summary

----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 112, 112]           9,408
       BatchNorm2d-2         [-1, 64, 112, 112]             128
         MaxPool2d-3           [-1, 64, 56, 56]               0
            Conv2d-4           [-1, 64, 56, 56]           4,096
       BatchNorm2d-5           [-1, 64, 56, 56]             128
            Conv2d-6           [-1, 64, 56, 56]          36,864
       BatchNorm2d-7           [-1, 64, 56, 56]             128
            Conv2d-8          [-1, 256, 56, 56]          16,384
       BatchNorm2d-9          [-1, 256, 56, 56]             512
           Conv2d-10          [-1, 256, 56, 56]          16,384
      BatchNorm2d-11          [-1, 256, 56, 56]             512
       Bottleneck-12          [-1, 256, 56, 56]               0
           Conv2d-13           [-1, 64, 56, 56]          16,384
      BatchNorm2d-14           [-1, 64, 56, 56]             128
           Conv2d-15           [-1, 64, 56, 56]          36,864
      BatchNorm2d-16           [-1, 64, 56, 56]             128
           Conv2d-17          [-1, 256, 56, 56]          16,384
      BatchNorm2d-18          [-1, 256, 56, 56]             512
       Bottleneck-19          [-1, 256, 56, 56]               0
           Conv2d-20           [-1, 64, 56, 56]          16,384
      BatchNorm2d-21           [-1, 64, 56, 56]             128
           Conv2d-22           [-1, 64, 56, 56]          36,864
      BatchNorm2d-23           [-1, 64, 56, 56]             128
           Conv2d-24          [-1, 256, 56, 56]          16,384
      BatchNorm2d-25          [-1, 256, 56, 56]             512
       Bottleneck-26          [-1, 256, 56, 56]               0
           Conv2d-27          [-1, 128, 56, 56]          32,768
      BatchNorm2d-28          [-1, 128, 56, 56]             256
           Conv2d-29          [-1, 128, 28, 28]         147,456
      BatchNorm2d-30          [-1, 128, 28, 28]             256
           Conv2d-31          [-1, 512, 28, 28]          65,536
      BatchNorm2d-32          [-1, 512, 28, 28]           1,024
           Conv2d-33          [-1, 512, 28, 28]         131,072
      BatchNorm2d-34          [-1, 512, 28, 28]           1,024
       Bottleneck-35          [-1, 512, 28, 28]               0
           Conv2d-36          [-1, 128, 28, 28]          65,536
      BatchNorm2d-37          [-1, 128, 28, 28]             256
           Conv2d-38          [-1, 128, 28, 28]         147,456
      BatchNorm2d-39          [-1, 128, 28, 28]             256
           Conv2d-40          [-1, 512, 28, 28]          65,536
      BatchNorm2d-41          [-1, 512, 28, 28]           1,024
       Bottleneck-42          [-1, 512, 28, 28]               0
           Conv2d-43          [-1, 128, 28, 28]          65,536
      BatchNorm2d-44          [-1, 128, 28, 28]             256
           Conv2d-45          [-1, 128, 28, 28]         147,456
      BatchNorm2d-46          [-1, 128, 28, 28]             256
           Conv2d-47          [-1, 512, 28, 28]          65,536
      BatchNorm2d-48          [-1, 512, 28, 28]           1,024
       Bottleneck-49          [-1, 512, 28, 28]               0
           Conv2d-50          [-1, 128, 28, 28]          65,536
      BatchNorm2d-51          [-1, 128, 28, 28]             256
           Conv2d-52          [-1, 128, 28, 28]         147,456
      BatchNorm2d-53          [-1, 128, 28, 28]             256
           Conv2d-54          [-1, 512, 28, 28]          65,536
      BatchNorm2d-55          [-1, 512, 28, 28]           1,024
       Bottleneck-56          [-1, 512, 28, 28]               0
           Conv2d-57          [-1, 256, 28, 28]         131,072
      BatchNorm2d-58          [-1, 256, 28, 28]             512
           Conv2d-59          [-1, 256, 14, 14]         589,824
      BatchNorm2d-60          [-1, 256, 14, 14]             512
           Conv2d-61         [-1, 1024, 14, 14]         262,144
      BatchNorm2d-62         [-1, 1024, 14, 14]           2,048
           Conv2d-63         [-1, 1024, 14, 14]         524,288
      BatchNorm2d-64         [-1, 1024, 14, 14]           2,048
       Bottleneck-65         [-1, 1024, 14, 14]               0
           Conv2d-66          [-1, 256, 14, 14]         262,144
      BatchNorm2d-67          [-1, 256, 14, 14]             512
           Conv2d-68          [-1, 256, 14, 14]         589,824
      BatchNorm2d-69          [-1, 256, 14, 14]             512
           Conv2d-70         [-1, 1024, 14, 14]         262,144
      BatchNorm2d-71         [-1, 1024, 14, 14]           2,048
       Bottleneck-72         [-1, 1024, 14, 14]               0
           Conv2d-73          [-1, 256, 14, 14]         262,144
      BatchNorm2d-74          [-1, 256, 14, 14]             512
           Conv2d-75          [-1, 256, 14, 14]         589,824
      BatchNorm2d-76          [-1, 256, 14, 14]             512
           Conv2d-77         [-1, 1024, 14, 14]         262,144
      BatchNorm2d-78         [-1, 1024, 14, 14]           2,048
       Bottleneck-79         [-1, 1024, 14, 14]               0
           Conv2d-80          [-1, 256, 14, 14]         262,144
      BatchNorm2d-81          [-1, 256, 14, 14]             512
           Conv2d-82          [-1, 256, 14, 14]         589,824
      BatchNorm2d-83          [-1, 256, 14, 14]             512
           Conv2d-84         [-1, 1024, 14, 14]         262,144
      BatchNorm2d-85         [-1, 1024, 14, 14]           2,048
       Bottleneck-86         [-1, 1024, 14, 14]               0
           Conv2d-87          [-1, 256, 14, 14]         262,144
      BatchNorm2d-88          [-1, 256, 14, 14]             512
           Conv2d-89          [-1, 256, 14, 14]         589,824
      BatchNorm2d-90          [-1, 256, 14, 14]             512
           Conv2d-91         [-1, 1024, 14, 14]         262,144
      BatchNorm2d-92         [-1, 1024, 14, 14]           2,048
       Bottleneck-93         [-1, 1024, 14, 14]               0
           Conv2d-94          [-1, 256, 14, 14]         262,144
      BatchNorm2d-95          [-1, 256, 14, 14]             512
           Conv2d-96          [-1, 256, 14, 14]         589,824
      BatchNorm2d-97          [-1, 256, 14, 14]             512
           Conv2d-98         [-1, 1024, 14, 14]         262,144
      BatchNorm2d-99         [-1, 1024, 14, 14]           2,048
      Bottleneck-100         [-1, 1024, 14, 14]               0
          Conv2d-101          [-1, 512, 14, 14]         524,288
     BatchNorm2d-102          [-1, 512, 14, 14]           1,024
          Conv2d-103            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-104            [-1, 512, 7, 7]           1,024
          Conv2d-105           [-1, 2048, 7, 7]       1,048,576
     BatchNorm2d-106           [-1, 2048, 7, 7]           4,096
          Conv2d-107           [-1, 2048, 7, 7]       2,097,152
     BatchNorm2d-108           [-1, 2048, 7, 7]           4,096
      Bottleneck-109           [-1, 2048, 7, 7]               0
          Conv2d-110            [-1, 512, 7, 7]       1,048,576
     BatchNorm2d-111            [-1, 512, 7, 7]           1,024
          Conv2d-112            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-113            [-1, 512, 7, 7]           1,024
          Conv2d-114           [-1, 2048, 7, 7]       1,048,576
     BatchNorm2d-115           [-1, 2048, 7, 7]           4,096
      Bottleneck-116           [-1, 2048, 7, 7]               0
          Conv2d-117            [-1, 512, 7, 7]       1,048,576
     BatchNorm2d-118            [-1, 512, 7, 7]           1,024
          Conv2d-119            [-1, 512, 7, 7]       2,359,296
     BatchNorm2d-120            [-1, 512, 7, 7]           1,024
          Conv2d-121           [-1, 2048, 7, 7]       1,048,576
     BatchNorm2d-122           [-1, 2048, 7, 7]           4,096
      Bottleneck-123           [-1, 2048, 7, 7]               0
AdaptiveAvgPool2d-124           [-1, 2048, 1, 1]               0
          Linear-125                    [-1, 1]           2,049
================================================================
Total params: 23,510,081
Trainable params: 23,510,081
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 213.24
Params size (MB): 89.68
Estimated Total Size (MB): 303.50
----------------------------------------------------------------

Sparsity Analysis

========================================================
--- Custom Model Pruning Summary (Sparsity Analysis) ---
========================================================
Layer Name                               |    Total Params |   Non-Zero Params |    Sparsity (%)
------------------------------------------------------------------------------------------------
conv1                                    |           9,408 |             8,706 |           7.46%
layer1.0.conv1                           |           4,096 |             3,869 |           5.54%
layer1.0.conv2                           |          36,864 |            31,262 |          15.20%
layer1.0.conv3                           |          16,384 |            15,491 |           5.45%
layer1.0.shortcut.0                      |          16,384 |            15,496 |           5.42%
layer1.1.conv1                           |          16,384 |            14,637 |          10.66%
layer1.1.conv2                           |          36,864 |            31,446 |          14.70%
layer1.1.conv3                           |          16,384 |            15,515 |           5.30%
layer1.2.conv1                           |          16,384 |            14,624 |          10.74%
layer1.2.conv2                           |          36,864 |            31,304 |          15.08%
layer1.2.conv3                           |          16,384 |            15,484 |           5.49%
layer2.0.conv1                           |          32,768 |            29,357 |          10.41%
layer2.0.conv2                           |         147,456 |           119,158 |          19.19%
layer2.0.conv3                           |          65,536 |            60,536 |           7.63%
layer2.0.shortcut.0                      |         131,072 |           116,917 |          10.80%
layer2.1.conv1                           |          65,536 |            56,108 |          14.39%
layer2.1.conv2                           |         147,456 |           119,213 |          19.15%
layer2.1.conv3                           |          65,536 |            60,541 |           7.62%
layer2.2.conv1                           |          65,536 |            56,197 |          14.25%
layer2.2.conv2                           |         147,456 |           119,319 |          19.08%
layer2.2.conv3                           |          65,536 |            60,627 |           7.49%
layer2.3.conv1                           |          65,536 |            56,010 |          14.54%
layer2.3.conv2                           |         147,456 |           119,499 |          18.96%
layer2.3.conv3                           |          65,536 |            60,650 |           7.46%
layer3.0.conv1                           |         131,072 |           112,401 |          14.24%
layer3.0.conv2                           |         589,824 |           459,206 |          22.15%
layer3.0.conv3                           |         262,144 |           234,201 |          10.66%
layer3.0.shortcut.0                      |         524,288 |           447,057 |          14.73%
layer3.1.conv1                           |         262,144 |           214,255 |          18.27%
layer3.1.conv2                           |         589,824 |           459,692 |          22.06%
layer3.1.conv3                           |         262,144 |           234,345 |          10.60%
layer3.2.conv1                           |         262,144 |           213,651 |          18.50%
layer3.2.conv2                           |         589,824 |           458,166 |          22.32%
layer3.2.conv3                           |         262,144 |           234,589 |          10.51%
layer3.3.conv1                           |         262,144 |           213,339 |          18.62%
layer3.3.conv2                           |         589,824 |           457,707 |          22.40%
layer3.3.conv3                           |         262,144 |           234,370 |          10.59%
layer3.4.conv1                           |         262,144 |           213,488 |          18.56%
layer3.4.conv2                           |         589,824 |           456,744 |          22.56%
layer3.4.conv3                           |         262,144 |           234,751 |          10.45%
layer3.5.conv1                           |         262,144 |           213,085 |          18.71%
layer3.5.conv2                           |         589,824 |           455,580 |          22.76%
layer3.5.conv3                           |         262,144 |           234,267 |          10.63%
layer4.0.conv1                           |         524,288 |           425,539 |          18.83%
layer4.0.conv2                           |       2,359,296 |         1,769,880 |          24.98%
layer4.0.conv3                           |       1,048,576 |           896,161 |          14.54%
layer4.0.shortcut.0                      |       2,097,152 |         1,698,605 |          19.00%
layer4.1.conv1                           |       1,048,576 |           816,573 |          22.13%
layer4.1.conv2                           |       2,359,296 |         1,772,108 |          24.89%
layer4.1.conv3                           |       1,048,576 |           895,453 |          14.60%
layer4.2.conv1                           |       1,048,576 |           816,822 |          22.10%
layer4.2.conv2                           |       2,359,296 |         1,766,830 |          25.11%
layer4.2.conv3                           |       1,048,576 |           893,129 |          14.82%
linear                                   |           2,048 |             1,608 |          21.48%
------------------------------------------------------------------------------------------------
TOTAL PRUNABLE                           |      23,456,960 |        18,765,568 |          20.00%
========================================================
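Per-layer figures like those in the table can be computed directly from any loaded weight tensor: sparsity is simply the fraction of exactly-zero entries. The sketch below uses NumPy on a toy conv1-shaped tensor pruned to roughly 20% by magnitude; the thresholding is illustrative only, not the report's actual pruning code.

```python
# Sketch: computing (Total Params, Non-Zero Params, Sparsity %) for one layer,
# as in the table above. The pruning step is a toy magnitude threshold.
import numpy as np

def layer_sparsity(weight: np.ndarray) -> tuple[int, int, float]:
    """Return (total, non-zero, sparsity %) for one weight tensor."""
    total = weight.size
    nonzero = int(np.count_nonzero(weight))
    return total, nonzero, 100.0 * (total - nonzero) / total

# Toy example shaped like conv1 (64 x 3 x 7 x 7 = 9,408 weights),
# pruned so the smallest-magnitude 20% of entries become zero.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 3, 7, 7))
k = int(0.2 * w.size)                      # number of weights to prune
thresh = np.sort(np.abs(w), axis=None)[k]  # magnitude cut-off
w[np.abs(w) < thresh] = 0.0

total, nonzero, sparsity = layer_sparsity(w)
```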

Graphical Visualization

Model Architecture Graph

Weight Sparsity Visualization

Heatmaps of the weight matrices for the pruned layers. White pixels represent pruned (zeroed) weights.

(One "Weight Sparsity" heatmap per pruned layer, from conv1 through linear, covering every convolution and shortcut listed in the sparsity table above.)
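A heatmap of this kind can be rendered by flattening a conv kernel to 2-D and imaging its zero mask. The sketch below uses NumPy and Matplotlib on a toy conv1-shaped tensor; the shape and threshold are illustrative, not taken from the report's code.

```python
# Sketch: rendering one weight-sparsity heatmap like those above.
# A conv kernel is reshaped to out_channels x (in_channels * kh * kw).
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend, suitable for report generation
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 147))   # e.g. conv1 reshaped: 64 x (3*7*7)
w[np.abs(w) < 0.25] = 0.0        # stand-in for a pruned layer

mask = (w == 0.0)                # True where a weight was pruned
fig, ax = plt.subplots(figsize=(6, 3))
# With the 'gray' colormap, pruned entries (True = 1) render white
# and surviving weights render black, matching the figures above.
ax.imshow(mask, cmap="gray", aspect="auto", interpolation="nearest")
ax.set_title("Weight Sparsity for conv1")
ax.set_xlabel("flattened input index")
ax.set_ylabel("output channel")
fig.savefig("conv1_sparsity.png", dpi=150)
plt.close(fig)
```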

Training & Validation Metrics

Loss Over Epochs

Loss Plot

Accuracy Over Epochs

Accuracy Plot

Precision Over Epochs

Precision Plot

Recall Over Epochs

Recall Plot

F1-Score Over Epochs

F1-Score Plot

Final Test Set Evaluation

Area Under ROC Curve (AUROC): 0.9986
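AUROC and the confusion matrix below are standard scikit-learn computations over per-image labels and scores. The sketch below uses synthetic labels and sigmoid-style scores; the variable names are illustrative and not from the report's evaluation code.

```python
# Sketch: computing AUROC, the ROC curve, and a confusion matrix for a
# binary real/fake test set. Labels and scores here are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve, confusion_matrix

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)   # 0 = real, 1 = fake
# Well-separated synthetic scores: fakes cluster near 0.9, reals near 0.1.
scores = np.clip(y_true * 0.8 + rng.normal(0.1, 0.15, size=1000), 0.0, 1.0)

auroc = roc_auc_score(y_true, scores)
fpr, tpr, thresholds = roc_curve(y_true, scores)     # points for the ROC plot
cm = confusion_matrix(y_true, (scores >= 0.5).astype(int))  # 2x2 matrix
```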

ROC Curve

ROC Curve Plot

Confusion Matrix

Confusion Matrix Plot

Grad-CAM Analysis

Visualizing the model's attention on sample images.

Dataset - Real

(Five original / Grad-CAM overlay image pairs.)

Dataset - Fake

(Five original / Grad-CAM overlay image pairs.)

SynthBuster - Fake

(Five original / Grad-CAM overlay image pairs.)

Extracted Frames - Real

(Five original / Grad-CAM overlay image pairs.)

Extracted Frames - Fake

(Five original / Grad-CAM overlay image pairs.)